Web Survey Bibliography
Online surveys lack social interaction. In the relevant literature this is most often seen as an advantage (e.g. Tourangeau, Rips, & Rasinski, 2000). Yet human communication is not only a potential source of bias; it is also a source of motivation. Its absence could therefore contribute to the elevated drop-out rates inherent to online surveys.
The Social Interface Theory (e.g. Nass, Moon, & Green, 1997) states that even marginal human cues in an interface can elicit behaviours commonly found in face-to-face contact. Building on this effect, we attempted to simulate social interaction in an online survey within an experimental setting (n = 2046).
This was done by 1) addressing participants directly with additional texts displayed on several pages of the questionnaire. In these texts, participants were, for example, thanked for their participation or shown understanding when lengthy matrix questions had to be answered. As further experimental conditions, the texts were presented by either 2) a static or 3) an animated virtual person (avatar). One of ten avatars, which varied in gender and age among other characteristics, was chosen at random. In a 3x2 design, these three text conditions were additionally crossed with whether participants were addressed personally by name. A condition without additional texts served as the control condition.
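The random assignment described above can be sketched as follows. This is an illustrative reconstruction, not the authors' actual code; the allocation probability for the control cell and the avatar labels are assumptions, as the abstract does not report them.

```python
import random

# Factor 1: how the additional texts are presented (plain text, static avatar,
# animated avatar). Factor 2: whether the participant is addressed by name.
TEXT_CONDITIONS = ["plain_text", "static_avatar", "animated_avatar"]
NAME_CONDITIONS = [True, False]
# Ten avatars varying in gender and age; labels are placeholders.
AVATARS = [f"avatar_{i}" for i in range(1, 11)]

def assign_condition(rng=random):
    """Randomly assign one respondent to a cell of the 3x2 design or control."""
    # One extra cell serves as the control group without additional texts.
    # Equal allocation across the 6 design cells plus control is assumed.
    if rng.random() < 1 / 7:
        return {"condition": "control"}
    condition = {
        "condition": rng.choice(TEXT_CONDITIONS),
        "address_by_name": rng.choice(NAME_CONDITIONS),
    }
    # Avatar conditions additionally draw one of the ten avatars at random.
    if condition["condition"] in ("static_avatar", "animated_avatar"):
        condition["avatar"] = rng.choice(AVATARS)
    return condition
```

Drawing the condition once per respondent at questionnaire start keeps the assignment independent of any respondent characteristic, as required for the experimental comparison.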
No significant influence on drop-out behaviour (rate, time, or point of drop-out) or on answer behaviour (e.g. social desirability, indicators of data quality, or questionnaire ratings) was found. Significant interaction effects between the gender of the avatar and that of the participant emerged for some questions concerning gender roles. Furthermore, male participants, younger participants, and those reporting a high level of social discomfort reacted more negatively to the human cues.
One possible reason the results do not confirm the hypotheses is the participants' very high interest in the topic. This interest could also explain the very low drop-out rate despite the length of the questionnaire (approximately 30 minutes). Participants may therefore have considered the additional motivation attempts superfluous.
General online research (GOR) 2008 (abstract)